Minimax Lower Bounds for Noisy Matrix Completion Under Sparse Factor Models
Authors
Abstract
This paper examines fundamental error characteristics for a general class of matrix completion problems, in which the matrix of interest is the product of two a priori unknown matrices, one of which is sparse, and the observations are noisy. Our main contributions are minimax lower bounds on the expected per-element squared error for these problems under several noise/corruption models; specifically, we analyze scenarios in which the corruptions are characterized by additive Gaussian noise or additive heavier-tailed (Laplace) noise, Poisson-distributed observations, and highly quantized (e.g., one-bit) observations. Our results establish that the error bounds derived in (Soni et al., 2014) for complexity-regularized maximum likelihood estimators achieve, up to multiplicative constants and logarithmic factors, the minimax error rates in each of these noise scenarios, provided the sparse factor exhibits linear sparsity.
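As a point of reference, the setting can be summarized with the following sketch; the notation here is our own shorthand for a standard sparse factor model and observation setup, not necessarily the symbols used in the paper. The matrix of interest is
\[
X^\ast = D^\ast A^\ast, \qquad D^\ast \in \mathbb{R}^{n_1 \times r}, \quad A^\ast \in \mathbb{R}^{r \times n_2}, \quad \|A^\ast\|_0 \le k,
\]
where \(A^\ast\) is the sparse factor, and noisy observations of the entries of \(X^\ast\) are collected at a subset of locations. The quantity being lower-bounded is the minimax expected per-element squared error,
\[
\inf_{\widehat{X}} \ \sup_{X^\ast} \ \mathbb{E}\!\left[ \frac{1}{n_1 n_2} \big\| \widehat{X} - X^\ast \big\|_F^2 \right],
\]
with the infimum taken over all estimators \(\widehat{X}\) formed from the observations and the supremum over the class of matrices admitting such a sparse factorization.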
Similar Papers
Multi-Step Stochastic ADMM in High Dimensions: Applications in Sparse Optimization and Noisy Matrix Decomposition
We propose an efficient ADMM method with guarantees for high-dimensional problems. We provide explicit bounds for the sparse optimization problem and the noisy matrix decomposition problem. For sparse optimization, we establish that the modified ADMM method has an optimal regret bound of O(s log d/T ), where s is the sparsity level, d is the data dimension and T is the number of steps. This mat...
Minimax Bounds for Sparse PCA with Noisy High-Dimensional Data
We study the problem of estimating the leading eigenvectors of a high-dimensional population covariance matrix based on independent Gaussian observations. We establish a lower bound on the minimax risk of estimators under the l2 loss, in the joint limit as dimension and sample size increase to infinity, under various models of sparsity for the population eigenvectors. The lower bound on the risk...
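For context, a minimal sketch of the loss typically used in this line of work (this formulation reflects the standard convention and is our assumption, not a quotation from the paper): the l2 loss for an estimate \(\widehat{v}\) of a unit-norm leading eigenvector \(v\) is taken modulo sign,
\[
L(\widehat{v}, v) = \min_{s \in \{-1, +1\}} \big\| s\,\widehat{v} - v \big\|_2^2,
\]
and the minimax risk is the infimum over estimators of the worst-case expectation of this loss over the assumed sparsity class of eigenvectors.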
Noisy Matrix Decomposition via Convex Relaxation: Optimal Rates in High Dimensions
We analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems. The observations are noisy realizations of a linear transformation X of the sum of an (approximately) low rank matrix with a second matrix endowed with a complementary form of low-dimensional structure; this set-up includes many statistical models of interest, including factor...
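For concreteness, a minimal sketch of the observation model described above, in our own generic notation (an assumption of the standard noisy matrix decomposition setup rather than the paper's exact symbols):
\[
Y = \mathfrak{X}\big(\Theta^\ast + \Gamma^\ast\big) + W,
\]
where \(\Theta^\ast\) is (approximately) low rank, \(\Gamma^\ast\) carries a complementary form of low-dimensional structure (for example, entrywise or columnwise sparsity), \(\mathfrak{X}\) is a known linear observation operator, and \(W\) is additive noise.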
Journal: CoRR
Volume: abs/1510.00701
Issue: -
Pages: -
Publication date: 2015